ENVIRONMENTAL SENSOR SYSTEM
Patent abstract:
A system (100) for detecting a characteristic of an object. The system (100) includes at least one projector (101) for projecting a radiation pattern in a pulsed mode, the pattern comprising a spatial distribution of simultaneously emitted spots. The system (100) also includes a detector (102) having a plurality of pixels for detecting a reflected radiation pattern, a processor (103) for processing detected data based on a spot shift calculation, and a synchronization means (104) that interfaces between the detector (102) and the radiation source (101) and synchronizes the pulses from the radiation source (101) with sampling by the detector (102).
Publication number: BE1021971B1
Application number: E2014/0536
Filing date: 2014-07-09
Publication date: 2016-01-29
Inventors: Den Bossche Johan Van; Dyck Dirk Van
Applicants: Xenomatix Nv; Xenomatix Bvba
Main IPC class:
Patent description:
Environmental sensor system
Field of the invention
The invention relates to the field of characterizing a scene, a part thereof, or an object. More specifically, the invention relates to systems and methods for detecting a characteristic, e.g. a profile or property, of an object or person in the vicinity of a vehicle, or of the vehicle itself.
BACKGROUND OF THE INVENTION
There are a large number of applications in which knowledge of the 3D profile of an object is relevant. Various techniques exist for scanning the profile of an object. They can in principle be classified into radar-based systems, ultrasound-based systems and optical detection systems. Radar-based systems have the advantage that they can scan a large range, but the disadvantage that they have poor angular and depth resolution for certain applications (e.g. for tracing the profile of a road). Ultrasound-based systems can be useful for short-range scanning, but their narrow bandwidth limits depth sensitivity and sampling resolution, and the strong absorption in air limits the range to a few meters. Optical detection methods can be subdivided into types that measure the distance by means of time-of-flight measurements or by triangulation. In time-of-flight based methods, the object is illuminated by a light source, and the distance traveled by the light can be determined from the delay between the emission and the detection of the reflection. Time-of-flight based methods may use pulsed illumination. In triangulation-based systems, the unknown position of an object is calculated using triangulation. An example of such a system is the "Kinect system" from Microsoft, described in US8320621. In this system, structured infrared light (e.g. circles) is projected and observed with a 3D camera. This system, which is primarily intended for gaming and entertainment applications in an indoor environment, is not suitable for outdoor use, given the intensity of sunlight.
In stereo vision, the distance to an object is determined from the local shift between corresponding parts of the images obtained by two cameras at different angles, or by one stereo camera with two lenses. Stereo-vision-based systems can use existing robot vision setups and algorithms, can work using ambient light, and do not require projection. On the other hand, stereo-vision-based systems have the disadvantage that calibrated cameras with a sufficient baseline are needed. Furthermore, sufficient structure is needed in the images to allow cross-parallax correlation, it is difficult to detect flat surfaces and water, a sufficient number of pixels is required, the depth sensitivity is limited, and the cameras used must have a large dynamic range to cope with varying lighting conditions. The biggest stumbling block appears to be that stereo-vision-based systems cannot work if there is insufficient structure in the object being scanned. Therefore, there is still room for improvement of environmental sensor systems that can be used outdoors for scanning the profile of objects over a large distance with a high resolution and at a high speed.
Summary of the invention
It is an object of embodiments of the present invention to provide good systems and methods for determining a characteristic of an object, a scene or a portion thereof. In specific embodiments, the characteristic may be a profile of an object, e.g. the profile of a road. It is an advantage of embodiments of the present invention that they are robust to the variation in light conditions that may occur in an outdoor environment, such as daylight and/or rain. Furthermore, it is an advantage of embodiments of the present invention that they are robust to the light of other vehicles. It is an advantage of embodiments of the present invention that scanning is possible over a range of 1 to 15 m, in some embodiments even over a range of 1 to 30 m, and in some embodiments up to 200 m.
By enabling a range of up to 200 m, embodiments of the present invention are particularly suitable for use in autonomous vehicles. The maximum range can be improved by using better cameras and lasers; use can be made of the advantages of semiconductor technology to produce such components. The accuracy of the profile determination depends on the range to be scanned. In embodiments of the present invention, an accuracy of 1/1000 of the range can be obtained. By "accuracy" is meant here the accuracy of the distance between the car and the road. The vertical accuracy of the "local height of the road" can even be 10 times better. It is an advantage of embodiments of the present invention that a viewing angle of 1 radian horizontal and 1 radian vertical can be obtained. Other horizontal and vertical viewing angles can be selected depending on the application. Furthermore, if larger viewing angles are required, multiple systems can be combined. It is an advantage of embodiments of the present invention that they are robust to vibration. It is an advantage of embodiments of the present invention that components used in systems of the present invention generally have a long service life. It is an advantage of embodiments of the present invention that the average radiation power is less than 1 mW per spot (or light spot). When a radiation pattern of 100 by 100 spots is used, this results in a total average radiation power of 10 W. The radiation level can therefore comply with generally applied safety regulations. Furthermore, it is an advantage of embodiments of the present invention that the power consumption of systems according to embodiments of the present invention is low. This is particularly important in a vehicle environment, where the vehicle's power system must not be overloaded by the sensor system.
It is an advantage of embodiments of the present invention that they can be easily installed and that their alignment is simple and can even be automated. The initial alignment can be carried out, for example, by scanning a flat surface and recording the positions of the projected spots as an initial reference. Any change in the relative position between the projector and detector can then be easily detected by observing the projected pattern as a whole. It is an advantage of embodiments of the present invention that a lightweight and compact system can be provided. It is an advantage of embodiments of the present invention that an inexpensive system can be obtained, since it can be based, for example, on components that can be made using standard processing techniques. It is an advantage of at least some embodiments of the present invention that no mechanical scanning of the object to be studied is required, resulting in a mechanically less complex system. In addition, it is an advantage that the basic components can be readily available, off-the-shelf components such as, for example, CMOS and CCD cameras and laser arrays. When multiple cameras are used in a system, use can be made of one or more radiation splitters, such as a semi-transparent mirror, a beam splitter, a beam splitting cube, etc., for aligning the different cameras. It is an advantage of embodiments of the present invention that for basic components such as CMOS and CCD cameras and laser arrays there is a constant increase in performance and a decrease in cost. It is an advantage of embodiments of the present invention that the performance is scalable. It is an advantage of embodiments of the present invention that the system design and component selection can give it reliable operation regardless of weather conditions, e.g. when the system operates at night, in rain or in fog, regardless of the quality of the road, regardless of the surface material, etc.
Systems according to embodiments of the present invention are very robust and provide reliable operation in many different environmental conditions. The operation can be virtually independent of the road surface, the type of material, the weather, day or night, etc. It is an advantage of embodiments of the present invention that the measured and/or processed data can be used for saving energy and/or for optimizing energy consumption in the system being controlled, e.g. an active suspension system. Without changing the basic principles of the method, the performance is already increased simply by using newer versions of cameras and lasers. For example, in 3 years the resolution of CMOS cameras has risen from 1 megapixel to 4 megapixels at the same price. There are various trade-offs that can be adjusted depending on the application requirements (e.g. a trade-off between power and field of view), resulting in optimum characteristics for specific applications. In at least some embodiments of the present invention, the object whose characteristic, e.g. a profile or property, is to be determined is the road ahead of the vehicle. In some embodiments of the present invention, information about the road ahead of the vehicle is used to control the vehicle's suspension system. It is an advantage of embodiments of the present invention, especially when they are used in automotive applications, that they still function at speeds of 50 m/s. It is an advantage of embodiments of the present invention that, in addition to information about the observed object such as a profile, various other parameters can also be derived. For example, when applied in an automotive environment, the 3D orientation of a car, the speed of the car, the presence of approaching cars or other objects, the presence of water on the road surface, etc. can also be obtained. The above objective is achieved by a method and system according to the present invention.
The present invention relates to a system for detecting a characteristic, e.g. a profile or property, of an object, the system comprising: at least one projector, which comprises for example one or more radiation sources, for projecting an irradiation pattern that comprises a spatial distribution of simultaneously emitted spots, the projector being designed to operate in a pulsed mode; at least one detector containing a plurality of pixels for detecting a reflection of the irradiation pattern reflected by an object; a processor for processing the data detected by the at least one detector, the processor being configured to process the detected data based on spot shift calculations; and a synchronizing means that interfaces between the at least one detector and the at least one radiation source and that is provided for synchronizing the at least one detector with the at least one projector, so that radiation to be processed is only detected by the detector during the radiation pulses. The projector can be based on a radiation source that provides monochromatic radiation in the near-infrared spectrum. The at least one projector may comprise at least one laser and at least one optical grating for generating a plurality of laser spots of the irradiation pattern. It is an advantage of embodiments of the present invention that efficient phase gratings can be obtained, wherein the gratings comprise a system of grooves with a specific depth in a specific transparent material. In an advantageous embodiment, a set of horizontal and vertical grooves can be used. The design is also flexible, so that it can be optimized with regard to the design specification. The phase grating can be a discrete phase grating such as, but not limited to, a Dammann grating. The laser spots can be irradiated onto the object. The projector can include a semiconductor laser. The projector can include a multi-laser radiation source.
The projector can include a single VCSEL source or a VCSEL array. The at least one radiation source and the shutter may be provided to transmit pulses with pulse widths in the microsecond range. The processor may be provided for determining a characteristic of an object by determining a displacement of spots detected with the at least one detector with respect to predetermined reference spot positions. The predetermined reference spot positions may, for example, be determined in a calibration phase. The projector may include projector optics for projecting the irradiation pattern in the required field of view. The projector optics may comprise microarray optics for separately projecting different spots of the irradiation pattern, and/or the projector optics may comprise a diverging lens system for widening the irradiation pattern to achieve a predetermined field of view. The projector can be configured to be mechanically oriented to the field of interest. The projector can be configured to project a subset of spots of the irradiation pattern. The projector operating in pulsed mode may be adapted to sequentially repeat projection of the irradiation pattern of spots or a portion thereof. The projector can be configured to individually control spots or subgroups of spots of the irradiation pattern with respect to their intensity per area, shape and/or activation. The projector may be configured to perform said individual control for generating spots of equal intensity per area in the spatially emitted spot pattern, independent of the position of the spot in the pattern. The projector can have an FWHM (full width at half maximum) of the irradiation wavelength of 1 nanometer or less. The projected spots of the spatial irradiation pattern can have a divergence of no more than 3 mrad.
The at least one detector can be a CMOS or a CCD sensor. The detector may comprise a narrow bandpass spectral filter positioned in front of the detector, the narrow bandpass spectral filter being matched to the irradiation wavelength. The narrow bandpass filter may comprise a dome-shaped optical element with a filter coating applied to it. The detector can include an array of microlenses that adapt the angle of incidence of incoming reflected radiation on a narrow bandpass filter. The detector can include detector optics with a telecentric lens design. The at least one detector can be a plurality of detectors, e.g. two detectors or more. The system may comprise a shutter, wherein the shutter, in the closed state, blocks incident radiation on the detector, and wherein the synchronizing means is adapted to synchronize the pulses of the at least one radiation source with the opening and closing of the shutter. The detector may include a variable attenuation filter or pupil for equalizing spot intensity for spots located at different positions in the irradiation pattern. The processor can be configured to perform 3D reconstruction of a profile of an object. The processor can be adapted to process the detected data on the basis of triangulation. The processor may be adapted to determine a 3D characteristic of an object by determining a displacement of spots, detected with the at least one detector, with respect to predetermined reference spot positions. The processor can be adapted to perform multi-pixel fitting.
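The multi-pixel fitting mentioned above can be illustrated with a minimal sketch: an intensity-weighted centroid over a small pixel window locates a spot centre with sub-pixel accuracy. This is a generic illustration, not the patent's specific fitting method; the window size, the synthetic Gaussian spot and all numeric values are assumptions.

```python
import numpy as np

def spot_centroid(image, seed, window=3):
    """Estimate a spot centre with sub-pixel accuracy by computing the
    intensity-weighted centroid in a small window around a seed pixel."""
    r0, c0 = seed
    patch = image[r0 - window:r0 + window + 1,
                  c0 - window:c0 + window + 1].astype(float)
    rows, cols = np.mgrid[r0 - window:r0 + window + 1,
                          c0 - window:c0 + window + 1]
    total = patch.sum()
    return (rows * patch).sum() / total, (cols * patch).sum() / total

# Synthetic 2D Gaussian spot centred at (10.3, 12.7) on a 24 x 24 frame
rr, cc = np.mgrid[0:24, 0:24]
img = np.exp(-((rr - 10.3) ** 2 + (cc - 12.7) ** 2) / (2 * 1.5 ** 2))
r, c = spot_centroid(img, seed=(10, 13))
print(round(r, 2), round(c, 2))  # close to (10.3, 12.7) despite integer pixels
```

In practice the seed pixel would come from a coarse peak search, and the estimate feeds the spot shift calculation described elsewhere in this document.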
The processor may be adapted to perform data processing on the basis of intensity analysis and/or spot size analysis, and/or on the basis of difference analysis of multiple images from different cameras, images with different irradiation wavelengths, or images with and without spots, and/or the data processing may be adapted to pre-filter the recorded data to create an equal intensity of spots in the irradiation pattern. The obtained data can be used as input for a more refined model-based fitting of a characteristic, e.g. a profile or property, of the road or object, such as spline fitting, and even for more advanced models of the movement of the object. In addition to a triangulation principle based on a single detector, embodiments of the present invention can also use a parallax between different detectors. Furthermore, embodiments of the present invention can optionally also be combined with stereo vision. The system may include an interface for outputting the information obtained. The at least one projector and the at least one detector have a different angular orientation with respect to the object being irradiated. The present invention also relates to a vehicle comprising a system as described above. In the vehicle, the system may be mounted on top of the vehicle at one or a plurality of locations to cover a predetermined field of view, and/or the system may be fully integrated into existing cavities or newly created spaces of the vehicle. The predetermined field of view can be a 360° field of view of the environment in the plane in which the vehicle is moving. The vehicle may have an adjustable suspension, the vehicle comprising a suspension system and a control system, wherein the control system is adapted to receive profile information from the system for determining a characteristic of an object and is adapted to use the characteristic, e.g. a profile or a property, of the object for controlling the suspension system.
The present invention also relates to a camera, wherein the camera comprises a system as described above, wherein the system is adapted to add 3D information to the camera image, making it possible to create a 3D image. The present invention further relates to a method of detecting a characteristic, e.g. a profile or a property, of an object, a scene or a portion thereof, the method comprising: transmitting a pulsed radiation pattern to the object using at least one projector, the radiation pattern being a spatial distribution of simultaneously emitted spots; detecting the reflected pattern using at least one detector having a plurality of pixels, the detection being synchronized with the pulsed radiation pattern; and processing the data from the at least one detector for determining a characteristic of an object based on spot shift calculation. The present invention also relates to the use of a system as described above for measuring a profile, a texture and/or a condition of the road ahead of a car. Such a condition may include the presence of mud, ice, water, etc. The present invention also relates to the use of a system as described above for driving autonomous vehicles. The present invention also relates to the use of a system as described above for 3D reconstruction of the road, as input for the control of the active suspension of a vehicle. The present invention also relates to the use of a system as described above for determining a dynamic vehicle parameter such as rolling motion, tilt or speed. Specific and preferred aspects of the invention are included in the appended independent and dependent claims. Features of the dependent claims can be combined with features of the independent claims and with features of other dependent claims as appropriate, and not merely as explicitly stated in the claims. These and other aspects of the invention will become apparent from and clarified with reference to the embodiment(s) described below.
BRIEF DESCRIPTION OF THE FIGURES
FIG. 1 shows a schematic overview of various components and their interactions in an exemplary system according to embodiments of the present invention. FIG. 2 shows a schematic representation of an exemplary method for obtaining a characteristic, according to an embodiment of the present invention. FIG. 3 shows a schematic representation of a triangulation principle as can be used in an embodiment according to the present invention. The figures are only schematic and non-limiting. In the figures, the dimensions of some parts may be exaggerated and not drawn to scale for illustrative purposes. Reference numbers in the claims shall not be interpreted as limiting the scope of protection.
Detailed description of embodiments of the invention
The present invention will be described with reference to particular embodiments and with reference to certain drawings; however, the invention is not limited thereto but is only limited by the claims. The figures are only schematic and non-limiting. In the figures, the dimensions of some parts may be exaggerated and not drawn to scale for illustrative purposes. The dimensions and the relative dimensions sometimes do not correspond to actual practical embodiments of the invention. Furthermore, the terms first, second and the like in the description and in the claims are used to distinguish between similar elements and not necessarily to describe a sequence, whether in time, in space, in ranking or in any other way. It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operating in a different order than described or depicted herein. In addition, the terms above, below and the like in the description and claims are used for descriptive purposes and not necessarily to describe relative positions.
It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operating in orientations other than those described or shown herein. It is to be noted that the term "comprises", as used in the claims, is not to be interpreted as being limited to the means described thereafter; this term does not exclude other elements or steps. It is therefore to be interpreted as specifying the presence of the listed features, integers, steps or components referred to, but does not exclude the presence or addition of one or more other features, integers, steps or components, or groups thereof. Thus, the scope of the expression "a device comprising means A and B" should not be limited to devices that consist only of components A and B. It means that, with regard to the present invention, A and B are the only relevant components of the device. Reference throughout this specification to "one embodiment" or "an embodiment" means that a specific feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, occurrences of the expressions "in one embodiment" or "in an embodiment" at various places throughout this specification do not necessarily all refer to the same embodiment, but may do so. Furthermore, the specific features, structures or characteristics may be combined in any suitable manner, as would be apparent to those skilled in the art on the basis of this disclosure, in one or more embodiments. Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure or description thereof for the purpose of streamlining the disclosure and assisting in the understanding of one or more of the various inventive aspects.
This method of disclosure should not be interpreted as reflecting an intention that the invention requires more features than are explicitly mentioned in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all the features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby explicitly included in this detailed description, with each independent claim as a separate embodiment of the present invention. Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are intended to be within the scope of the invention, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the described embodiments can be used in any combination. Numerous specific details are set forth in the description provided here. It is, however, understood that embodiments of the invention can be practiced without these specific details. In other cases, well-known methods, structures and techniques have not been shown in detail in order to keep this description clear. Where in embodiments of the present invention reference is made to an object, reference is made to objects that are stationary or moving relative to the vehicle, including the road and road signalling; people using the road, such as pedestrians and cyclists; animals; other vehicles; substances on the surface of the road, such as water, sand, mud, leaves, snow, ice, dirt, grit and the like; and, in agricultural applications, crops, mowed grass or hay (in bales or loose) on the ground. Where in embodiments of the present invention reference is made to a vehicle, reference is made to cars, trucks, trains, forklifts, etc., regardless of whether they are driven by a combustion engine, an electric motor or the like, and regardless of their connection to the ground.
Where in embodiments of the present invention reference is made to the "near-infrared region", reference is made to radiation with a wavelength between 700 and 1500 nm, and in particular between 800 and 1000 nm. Where in embodiments of the present invention reference is made to a "radiation or irradiation pattern", the radiation or irradiation pattern is physically or logically composed of radiation spots (e.g. light spots) characterized by their spot size and intensity. Where in embodiments of the present invention reference is made to "triangulation", reference is made to the observation of an object at an angle and the determination of the distance to the spot based on a shift between a reference position and an observed position of a corresponding spot in the image seen by the camera. More generally, techniques can be used that are based on determining the distance to a spot from a shift between a reference position and an observed position of a corresponding spot in the viewed image, also referred to as "spot shift calculation". In a first aspect, the present invention relates to a system for detecting a characteristic, e.g. a profile or property, of an object, a scene or a portion thereof. Such a property can be, for example, a texture or state of a road, such as the presence of mud, ice or water, the presence of one or more objects in a surveyed scene in the environment, etc. The object can also be a vehicle that has a system as described in the first aspect. A schematic overview of various components included in the system according to embodiments of the present invention is shown in FIG. 1. FIG. 1 shows a projector 101, which includes a radiation source for generating a pulsed radiation pattern. Although only one detector is shown, a plurality of detectors can also be used. The radiation pattern is advantageously a spot pattern. The system can be mounted in any suitable position.
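The spot shift calculation defined above can be sketched with the textbook triangulation relation Z = f * b / d for a rectified projector-camera pair, where d is the observed spot shift (disparity) on the sensor. The geometry and every numeric value below are illustrative assumptions, not parameters taken from the patent.

```python
def depth_from_shift(shift_px, pixel_pitch_m, focal_length_m, baseline_m):
    """Classic triangulation: depth Z = f * b / disparity, where the
    disparity is the observed spot shift converted to metres on the sensor.
    A rectified projector/camera geometry is assumed for illustration."""
    disparity_m = shift_px * pixel_pitch_m
    return focal_length_m * baseline_m / disparity_m

# Hypothetical numbers: 10 um pixels, 8 mm lens, 30 cm projector-camera baseline
z = depth_from_shift(shift_px=24, pixel_pitch_m=10e-6,
                     focal_length_m=8e-3, baseline_m=0.3)
print(round(z, 2))  # -> 10.0 (metres)
```

Note that depth resolution degrades quadratically with distance under this relation, which is consistent with the document's remark that accuracy is a fraction (about 1/1000) of the range.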
For example, if the system is mounted on a vehicle, it may be mounted on its roof or on its side, or it may be integrated into existing cavities and openings, such as, for example, cavities for the headlights or rear lights, or in the headlights or rear lights themselves, on or near the license plate, etc. The system can be mounted in cavities so that seamless integration can be achieved. The radiation pattern is reflected by the object to be studied and is captured by a detector 102, also shown in FIG. 1. The radiation source 101 and the detector 102 are synchronized by a synchronizing means 104. In embodiments according to the present invention, a shutter 105 may also be present, the shutter 105 also being synchronized by the synchronizing means 104 so that it blocks radiation to the detector 102 for as long as no radiation pattern is emitted. Alternatively, the detector can be oversampled, and only the samples corresponding to time windows in which pulses are sent are considered. Systems according to embodiments of the present invention can comprise one or more radiation sources. The processor 103 in FIG. 1 processes the data coming from the detector 102, thereby revealing profile information of the object being examined. The processing may advantageously be based on a spot-shift-based calculation method, such as triangulation. The processor may be adapted to determine a characteristic of an object by determining a displacement of detected spots, the spots being detected with the at least one detector, with respect to predetermined reference spot positions. The triangulation principle used in embodiments of the present invention is illustrated by way of example in FIG. 3. The triangulation method used can be based on a single detector, although the invention should not be considered to be limited to systems comprising only a single detector.
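The oversampling alternative to a physical shutter can be sketched as a simple software gate that keeps only the detector samples whose timestamps fall inside an emission pulse window. The function name, the timeline and the sample values below are hypothetical illustrations of the idea, not the patent's implementation.

```python
def gate_samples(samples, pulse_windows):
    """Software gating: keep only the (timestamp, value) samples whose
    timestamps fall inside an emission pulse window [t_start, t_end)."""
    kept = []
    for t, value in samples:
        if any(t0 <= t < t1 for t0, t1 in pulse_windows):
            kept.append((t, value))
    return kept

# Hypothetical timeline in microseconds: one 22 us pulse at the start of
# each 16 ms (60 Hz) camera frame
pulses = [(0, 22), (16000, 16022)]
samples = [(5, 0.9), (400, 0.1), (16010, 0.8), (30000, 0.2)]
print(gate_samples(samples, pulses))  # -> [(5, 0.9), (16010, 0.8)]
```

Whether gating is done optically (shutter 105) or in software, the effect is the same: ambient light collected outside the pulse windows never reaches the spot shift calculation.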
By way of illustration, without limiting embodiments of the present invention thereto, the various elements of an exemplary system according to an embodiment of the present invention will be discussed further. In embodiments of the present invention, the at least one projector is designed for projecting an irradiation pattern onto an object. The irradiation pattern is a spatial pattern of radiation spots. The radiation spots can be individual spots (e.g. dots), lines, specially shaped spots, etc. The projector 101 includes a radiation source designed to generate radiation, e.g. monochromatic radiation or radiation in a specific wavelength range (e.g. frequency band), in the near-infrared region. The near-infrared region has the advantage that it is invisible to humans while CMOS or CCD sensors are still sufficiently sensitive to radiation with a wavelength in this region. In this way the user is not distracted by the radiation. The at least one projector 101 is generally adapted to irradiate a field of view that extends over the entire object to be observed. To this end, optical elements can be provided, or they can form part of the at least one radiation source. In embodiments of the present invention, this object is a field of view located in front of a car. The object to be observed is irradiated using an irradiation pattern. In some embodiments of the present invention, this pattern is a regular or an irregular array of spots, e.g. sharp spots. The size of the spots can be of the order of 1/1000 of the range, i.e. 1 cm at a distance of 10 meters. According to embodiments of the present invention, the irradiation pattern can be an m x n spot pattern, where m is at least 1 and n is at least 1. The spots can have any geometric shape, such as, for example, an oval, a line, a circle or a disk. The radiation pattern can be regular, i.e. a complete matrix, or can be irregular. The radiation pattern can be repetitive, random, etc.
The arrangement of the spots in the radiation pattern can be selected as a function of the application. In embodiments of the present invention, the irradiation pattern can be produced using laser beams. It is an advantage of embodiments of the present invention that laser beams can provide a very large depth of field with simple optics. In some embodiments of the present invention, the radiation source present in the projector comprises one or more semiconductor lasers. These semiconductor lasers have a very good price-performance ratio. In embodiments of the present invention, large distance ranges can be bridged for projecting the pattern. Such a range may be between 1 m and 15 m, or even between 1 m and 30 m. To generate the irradiation pattern, the system may be equipped with an optical element, such as, for example, a prism-based microlens, to group the beams into an a x b pattern, thereby increasing the spot intensity at the point of incidence by a factor a * b compared to a single beam (a and b being numbers of at least 1). All spots of the radiation pattern can be emitted simultaneously. Alternatively, only a sub-pattern of the pattern is generated simultaneously, and different sub-patterns are generated one after the other, so that the entire radiation pattern is built up over time. In that case, the irradiation pattern is not provided in a single exposure pulse. In some embodiments of the present invention where the individual spots of a radiation pattern are emitted simultaneously, there are various options for realizing the radiation pattern. A single laser can be used in combination with a diffraction grating to produce an array of spots. Different types of diffraction gratings can be used. For efficiency, a phase grating in a transparent material can be used. The ease of manufacture (simple and inexpensive) is increased if the phase grating has linear grooves in two perpendicular directions. The depth of the grooves can typically correspond to a phase shift of 180°.
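For a transmissive phase grating, a 180° phase shift corresponds to a groove depth d = lambda / (2 * (n - 1)), the standard relation for a binary phase step. This can be evaluated as follows; the 850 nm wavelength and the fused-silica refractive index are assumed example values, not figures from the patent.

```python
def groove_depth(wavelength_m, refractive_index):
    """Groove depth giving a 180 degree (pi) phase shift in a transmissive
    binary phase grating: d = lambda / (2 * (n - 1))."""
    return wavelength_m / (2.0 * (refractive_index - 1.0))

# Assumed values: 850 nm near-infrared laser, fused silica (n ~ 1.45)
d = groove_depth(850e-9, 1.45)
print(round(d * 1e9))  # groove depth in nanometres, -> 944
```

Grooves of sub-micrometre depth like this are compatible with the standard lithographic processing techniques mentioned earlier as a cost advantage.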
The width of the grooves can be calculated so that the intensity envelope corresponds to the required field of view. The field of view (FOV) of the source and of the detector are both of the order of 1 radian horizontally and 1 radian vertically. In some embodiments, microarray optics can be used to focus each laser spot of the laser, e.g. a VCSEL. In other embodiments of the present invention, wherein the individual spots of the radiation pattern are emitted simultaneously, a VCSEL array is used as the radiation source 101. Such a VCSEL array can be an array with a low beam divergence. The size of the VCSEL array can be 10 x 10, but can also be larger. It is an advantage of embodiments that use VCSEL arrays that the geometry of the array and the shape of the spots can be adjusted depending on the application. The principle can be combined with scanning irradiation, as will be explained further below, so that the spots of the array are emitted in succession. In embodiments of the present invention, e.g. where vehicle applications with active suspension are envisaged, the projection of the spot pattern is advantageously directed downwards, i.e. towards the road. To have a large visibility range, the radiation power must be sufficiently high. Daylight has an average power density of around 500 W/m2. So if one wants to surpass the daylight by a factor of 5, a power density of 2500 W/m2 is required. If monochromatic light and a spectral filter are used, a factor of 100 can be recovered, so an optical power density of 25 W/m2 is required. In a calculation example, at a distance Z of 30 m, a divergence of 1 mrad and a projector height of 1 m, the projected spot size is 270 cm2, which requires an optical power per spot of 0.67 W, taking into account that the 25 W/m2 must be exceeded. The condition of eye safety must also be met: an average power of 1 mW per spot may not be exceeded. This can be achieved by operating in a pulsed mode.
So the duty cycle cannot exceed 1/700. With a CMOS camera at 60 Hz, the frame time is 16 ms, which corresponds to a pulse time of 22 µs (16 ms / 700). In another common calculation example, at a distance Z of 20 m, a divergence of 1 mrad and a projector height of 1 m, the projected spot size is 10 cm2, which requires an optical power per spot of 0.025 W, taking into account that the 25 W/m2 must be exceeded. The condition of eye safety must also be met: an average power of 1 mW per spot may not be exceeded. This can be achieved by operating in a pulsed mode. So the duty cycle cannot exceed 0.001 / 0.025. With a CMOS camera at 60 Hz the frame time is 16 ms, which corresponds to a pulse time of 640 µs (16 ms / 25). Thus, according to embodiments of the present invention, the irradiation is performed for short periods of time, i.e. in a pulsed mode. This has the advantage that a large radiation power can be generated (in pulsed mode the instantaneous intensity can be significantly higher than in continuous mode), while at the same time the average radiation power can be kept low. In the specific example in which a spot-like pattern is used for the irradiation, the intensity is locally concentrated at certain positions on the object, which also reduces the total amount of power required by the radiation source of the projector 101 to exceed the intensity of the daylight. Advantageously, the spot size can be selected in relation to the pixel size of the detector 102. As indicated above, the peak power of the radiation source must be sufficient so that the intensity in each spot of the projected pattern exceeds the daylight, but the average (continuous) power of each individual spot may not exceed the eye-safety requirement of 1 milliwatt. So for a pattern of 100 x 100 spots, the total average power of the laser may not exceed 10 W. This high peak power combined with low average power can be achieved by using short pulse times of the order of 50 microseconds.
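The two power-budget examples above can be reproduced with a few lines of arithmetic. The figures (500 W/m2 daylight, surpass factor 5, spectral filter factor 100, 1 mW eye-safety average, 60 Hz frame rate) are taken from the text; the exact pulse times differ slightly from the rounded values in the text, which uses a 16 ms frame time and the rounded ratio 1/700:

```python
def spot_peak_power(spot_area_m2, required_irradiance_w_m2=25.0):
    # 500 W/m2 daylight x factor 5, divided by 100 for the narrow
    # spectral filter -> 25 W/m2 required in each spot (from the text).
    return spot_area_m2 * required_irradiance_w_m2

def max_pulse_time(peak_power_w, frame_time_s=1 / 60, eye_safe_avg_w=1e-3):
    # Eye safety limits the average power per spot to ~1 mW, so the
    # duty cycle is bounded by eye_safe_avg_w / peak_power_w.
    duty_cycle = eye_safe_avg_w / peak_power_w
    return frame_time_s * duty_cycle

p1 = spot_peak_power(270e-4)  # 270 cm2 spot at 30 m -> ~0.675 W peak
t1 = max_pulse_time(p1)       # ~25 us (text rounds to 22 us via 16 ms / 700)
p2 = spot_peak_power(10e-4)   # 10 cm2 spot at 20 m -> 0.025 W peak
t2 = max_pulse_time(p2)       # ~667 us (text: 640 us via 16 ms / 25)
```

This confirms that pulse times of tens to hundreds of microseconds, as stated in the text, keep each spot both brighter than daylight and eye-safe on average.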
Another requirement for practical use of a laser is that the electrical power of the laser is not greater than 50 W. With an average optical power of 10 W, this requirement can easily be met by semiconductor lasers, which have a high efficiency. If a VCSEL laser array is used instead of a single laser, the power requirement must be met by the array as a whole. In some embodiments, at least two projectors may be included, one projector being fixed and another projector being steerable about at least one axis, and preferably about two axes, to operate in a smaller field of view than the fixed projector. The steerable projector is therefore suitable for zooming. Such a function can be used to detect certain objects in a scene, such as holes, birds, children, etc., in more detail. In some embodiments, a spot pattern can be introduced such that each spot, or certain groups of spots, is addressable and configurable in terms of intensity, spot size, etc. This allows grouping of spots, introducing irregularities (e.g. to simplify image processing), changing the distance between lines, i.e. making the distance variable, etc. In some specific embodiments, the intensity of the irradiation sources, e.g. VCSELs, can be controlled such that all spots have the same shape and the same intensity regardless of the distance or position in the projected spot pattern. The latter simplifies data processing and allows a more accurate measurement to be obtained, for example by avoiding amplitude limitation (clipping) of the pixel charge, and by allowing multi-pixel fitting for all spots in the spot pattern. It also makes it possible to fit a curve between profiles that have roughly the same geometry. In some specific embodiments, the radiation source, e.g. a VCSEL array, can be split into multiple zones, and the different zones can be controlled differently to compensate, e.g., for the intensity decrease for spots further away from the source and/or the detector.
For example, different zones can be controlled with a different power or ON time to partially or fully compensate for the different intensity loss of the different spots, depending on the position where they are projected. The latter therefore allows spots farther away from the detector to be driven with a higher power than spots closer to the detector. The images of the spots can thus be controlled to have a constant intensity, for example two thirds of the analog-to-digital conversion range. Optimal use can thus be made of the analog-to-digital conversion range, where necessary. The above-mentioned measures may be referred to as the use of a servo loop for saturation avoidance. In one example, the control of one or more spot properties can also be performed to compensate for objects that reflect differently, which could otherwise result in considerably different geometries of the reflected spots. For example, white lines for guiding traffic on the road have a considerably different reflectivity from other parts of the road. The system may be adapted to compensate for geometry changes of the reflected spot caused thereby. The above-mentioned control can be called an intensity servo loop. In some embodiments, the wavelength of the radiation source used can be controlled (thermal servo loop), so that it optimally matches the filters used, e.g. bandpass filters. The wavelength shift can, for example, be carried out by using a Peltier element or a heating element. A good correspondence between the wavelength of the radiation source and the filters used makes it possible to reduce and/or minimize the influence of disturbing ambient radiation such as daylight. After the object to be studied is irradiated by the at least one radiation source 101, the reflected radiation is detected by the detector 102. The detector 102 may e.g. be a CMOS or CCD detector, e.g. a high resolution CMOS or CCD megapixel camera, although embodiments of the present invention are not limited thereto.
The detector 102 can generally comprise a detector element and detector optics. In embodiments of the present invention, the detector 102 may comprise a plurality of N x N pixels, and the detector optics are selected to match this number of pixels with the angular resolution (1/N, e.g. 1 mrad). In addition, the radiation source 101 and the associated projector optics can be selected to have the same angular resolution as the detector (so that the spot size corresponds to the pixel size). In the case of a camera with 1 Megapixel, the number of measurements per second is also of the order of 1 million. If the vehicle is moving at a speed of 50 m/s and the road has a width of 20 m, the road will be sampled with a spatial resolution of 3 cm x 3 cm. To observe the same 30 m range using transit-time based systems, a transit time of 0.2 µs is required to cover the entire distance. During this time, the detector 102 is active and observes the entire field of view. This means that, with a fixed average power of 10 W, exposure is only possible during 1/100th of a second. This also means that, with a required detector opening time of 0.2 µs, only 10000 readings per second can be obtained. Therefore, embodiments of the present invention can achieve measurement rates that are a factor of 20 higher than those of transit-time based systems. The detector optics used in the detector can be focused on the largest range, to resolve the spots at that distance and match them to the pixel size. However, because the depth of focus is much smaller than the range, the images of spots at shorter distances will be widened. On the other hand, the intensity of the spots increases at smaller distances, so that the accuracy in determining the position increases. Calculations show that this easily compensates for the increased spot size. In embodiments of the present invention, the distance between the spots is selected such that overlap of spots on the detector 102 side is avoided.
In an exemplary embodiment, the distance between the spots is ten times the diameter of the spots. With regard to the sensitivity of the detector: even with an isotropic reflection coefficient of 1/1000, the total number of detected photons per pixel is higher than the detection threshold of generally available detectors. To prevent interference from daylight, various precautions can also be taken on the detector side. In embodiments of the present invention, a narrow band spectral filter can be placed in front of the detector 102. The narrow band filter only transmits radiation in the specific wavelength range that is emitted and blocks the rest of the daylight spectrum. In some embodiments, a specific optical or filter design is implemented to allow the use of a narrow band pass filter, with an FWHM transmission range between, e.g., 1 nm and 5 nm, or, e.g., between 1 nm and 3 nm. The optical or filter design can be adapted to compensate for the wavelength dependence of the filter caused by the angle of incidence. Different solutions are possible. In a first example, a spherical or spherical-like optical element (e.g. a dome or shell) with a filter coating is used. Below this spherical or spherical-like optical element, a conventional optical element is placed which allows imaging on the detector, e.g. a CMOS camera. In a second example, optics are provided which comprise conical elements for orienting the incident radiation, so that the angle of incidence on the filter coating is substantially perpendicular, e.g. does not deviate more than 9° from the normal, and thus the path traveled by the radiation through the filter is virtually the same for all radiation. One particular example of this is the use of a micro-prism matrix placed in front of the narrow bandwidth filter, so that the radiation is incident within an angle of incidence between +9° and -9° with respect to the normal of the filter.
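Why the ±9° limit matters can be illustrated with the standard blue-shift approximation for thin-film interference filters, λ(θ) = λ0·sqrt(1 − (sin θ / n_eff)²). The centre wavelength of 850 nm and the effective index of 2.0 below are assumed example values, not figures from the text:

```python
import math

def filter_centre_wavelength(l0_nm, angle_deg, n_eff=2.0):
    # Standard interference-filter blue-shift approximation:
    # lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)^2)
    s = math.sin(math.radians(angle_deg)) / n_eff
    return l0_nm * math.sqrt(1.0 - s * s)

# Assumed: 850 nm filter, effective index 2.0, worst-case 9 deg incidence
shift_nm = 850.0 - filter_centre_wavelength(850.0, 9.0)
```

For these assumed values the worst-case blue shift at 9° is roughly 2.6 nm, i.e. of the same order as the 1 nm to 5 nm FWHM mentioned above, which is why larger angles of incidence must be avoided.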
The prism filter can, for example, be manufactured by injection molding of plastic. In a third example, a telecentric lens design is used. These examples all result in radiation travelling a substantially identical path through the filter medium or, in other words, in the incident radiation being substantially perpendicular to the filter surface, which allows accurate filtering within a narrow bandwidth to, for example, filter out daylight or sunlight, so that the spots can exceed daylight. In some embodiments of the present invention, the detector is rotated slightly relative to the projector, or vice versa, e.g. through an angle of 4°, which allows for improved image processing and facilitates the identification of the spots, since the chance of overlap between the segments of the 2D projection on the image sensor of the 3D epipolar lines on which the spots are searched is reduced. In some embodiments of the present invention, the intensity of the spots can be kept substantially constant over the full depth range by applying a stepped or variable attenuation filter at the detector. Alternatively or additionally, a non-symmetrical lens pupil can be provided for attenuating the intensity of spots closer to the detector, while spots farther away from the detector are received at full intensity. In this way, amplitude limitation at the detector is avoided, and the average intensity can be made virtually constant for all spots. In embodiments of the present invention, a synchronizing means may be provided that synchronizes the detection with the pulses, so that unwanted radiation falling on the detector outside the pulse time of the radiation source can be rejected. This can be implemented in many ways, for example by over-sampling by the detector and by only taking into account the samples where a response to a pulse is expected. Alternatively, a shutter 105 may be used, placed in front of the detector 102.
The synchronizing means 104 then opens the shutter 105 during the time window in which the reflected radiation arrives at the detector 102. To that end, the synchronizing means 104 receives its synchronizing input signal from the radiation source 101. The time window depends on the pulse width and the object range. In embodiments of the present invention, the system also includes a processor 103 for processing the data received by the detector 102. The detected radiation pattern can e.g. be analyzed by triangulation. For example, if a spot pattern is used, the spot pattern is observed with the detector at a different angle. The distance traveled by a light beam that is part of the radiation pattern to a certain spot can then be determined by a spot-shift calculation, e.g. triangulation from the shift between the theoretical and the observed position of the corresponding spot in the image seen by the camera. For triangulation, the radiation source 101, the detector 102 and the object under investigation form a triangle. The baseline between the detector 102 and the radiation source 101 is known. The radiation angle is known, which makes it possible to determine distances. The position of a spot can be determined by image processing, whereby the intensity-weighted center of the pixels is calculated (cf. center of mass). The calculation of an intensity center is very simple and can be performed in real time with simple and inexpensive hardware. If the angular resolution of the radiation source 101 and the detector 102 is worse than 1/N, the observed spot size in the image is larger than one pixel. But since the spot profile is known, one can in principle obtain sub-pixel accuracy through multi-pixel fitting of the entire spot. In one embodiment of the present invention, the theoretical resolution can be calculated using the formulas below.
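The intensity-weighted center calculation described above can be sketched in a few lines; this is a generic centroid computation, not code from the patent:

```python
import numpy as np

def spot_centroid(window):
    """Intensity-weighted centre (row, col) of a pixel window around a spot,
    giving sub-pixel position (cf. centre of mass)."""
    w = np.asarray(window, dtype=float)
    total = w.sum()
    rows, cols = np.indices(w.shape)
    return (rows * w).sum() / total, (cols * w).sum() / total

# A 3x3 window with the spot slightly to the right of the central pixel:
win = [[0, 1, 0],
       [1, 4, 3],
       [0, 1, 0]]
r, c = spot_centroid(win)  # sub-pixel position: (1.0, 1.2)
```

Because only multiplications and sums over a small window are needed, this can indeed run in real time on simple hardware, as the text claims.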
In the example discussed below, a pattern of light spots is generated by a projector 101, and the observed object is the road ahead of a car, where:
- D is the distance between the radiation source 101 and the detector 102
- Z is the range over which the road is observed
- N is the number of pixels of the detector 102 in each direction
- the angular resolution of the projection and of the detection is 1/N
- the opening angle of the detector and optional lenses is 1 steradian
- H is the height of the projector above the road
The achievable resolution can be subdivided into:
- d: the distance resolution
- v: the vertical resolution
The theoretical distance resolution can be calculated as d = Z² / (N · D). The theoretical vertical resolution in the road profile can be calculated as v = H · Z / (N · D). So for the example of a 4 Megapixel detector (N = 2000), with D = 2 m, H = 1 m and Z = 20 m, a distance resolution of 10 cm (0.5%) and a vertical resolution of 5 mm can be obtained. At a distance of 1 m, the distance resolution is 0.5 mm and the vertical resolution is also 0.5 mm. As can be seen from the formulas, both the distance resolution and the vertical resolution are inversely proportional to the distance D between the radiation source 101 and the detector 102. Since this distance can be made much larger than the distance between the lenses in a 3D camera, the depth resolution is also relatively better for the same exposure. In one example, a spot pattern design is used with a spot spacing of 10 spot diameters. This corresponds to a grid of N/10 x N/10 spots. For a 1 Megapixel camera, the grid consists of 100 x 100 points. With a frame rate of 100 Hz, it is possible to efficiently sample the road in front of the tires with a vertical resolution of approximately 1 mm and a lateral sampling distance of 1 cm. The sampling distance in the direction of movement depends on the speed of the vehicle.
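The 4 Megapixel numerical example can be checked with the triangulation approximations d ≈ Z²/(N·D) and v ≈ H·Z/(N·D); these formulas are a reconstruction consistent with the example values in the text (one pixel of disparity at range Z):

```python
def distance_resolution(Z, N, D):
    # d ~ Z^2 / (N * D): range error for one pixel (1/N rad) of disparity
    return Z * Z / (N * D)

def vertical_resolution(Z, N, D, H):
    # v ~ H * Z / (N * D) = d * H / Z: the height error on the road profile
    return H * Z / (N * D)

# Example from the text: N = 2000, D = 2 m, H = 1 m, Z = 20 m
d = distance_resolution(20.0, 2000, 2.0)       # 0.10 m = 10 cm (0.5%)
v = vertical_resolution(20.0, 2000, 2.0, 1.0)  # 0.005 m = 5 mm
```

Both resolutions scale with 1/(N·D), which is why a large baseline D between source and detector improves depth resolution, as noted above.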
At the maximum speed of 50 m / s, the sampling distance in the direction of movement is of the order of 5 mm. This is more than sufficient for most applications that can be considered, which makes the concept very generic. In embodiments of the present invention, the data from the detector 102 may be processed on binarized images. It is an advantage of embodiments of the present invention that the required computing power of the processor 103 for processing the data is limited. In some embodiments, the processor may be adapted to calculate a relationship between successive images. The images need not be directly consecutive images, in other words, some images may have been skipped. Calculating the relationship may be based on correlation, least squares approach, etc., but is not limited to that. As will also be described further, such relationship may serve to determine a property of the moving object, e.g., vehicle, e.g., to determine a dynamic property. The relation can be used, for example, to determine a speed, an inclined position, a rolling movement, a rotation, etc. of a moving object. Such a relationship can also be used to determine a height measurement of complete surfaces of the object or parts of the object. The object can be in relative movement or can be stationary. In some embodiments, the processor may alternatively or in addition to other tasks, be configured to interpret image or pixel information for different applications. Such different applications can be, but are not limited to, pavement detection, road detection, road inspection, inspection of the condition of the pavement, detection of obstacles such as living things or objects, moving or stationary obstacles, etc. Detection of objects in relation to their environment can also be considered. In some embodiments, the system may be adapted to provide odometry information. 
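The correlation-based relationship between successive images mentioned above can be illustrated for a single scan line of height measurements. This is a simplified one-dimensional stand-in for the image-to-image correlation the text describes, using a plain cross-correlation to recover the shift between two frames:

```python
import numpy as np

def estimate_shift(a, b):
    """Integer shift that best aligns 1-D height profile b with profile a,
    found as the peak of the cross-correlation (simplified illustration)."""
    a = np.asarray(a, dtype=float) - np.mean(a)
    b = np.asarray(b, dtype=float) - np.mean(b)
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

profile = [0, 0, 1, 3, 1, 0, 0, 0]   # height profile in frame k
shifted = [0, 0, 0, 0, 1, 3, 1, 0]   # same feature, moved 2 samples in frame k+1
lag = estimate_shift(profile, shifted)
```

In two dimensions the same idea yields the (X, Y, theta) transformation between frames, from which speed, tilt or odometry information can be derived.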
The processor may be adapted to determine equivalent odometry information using successive detector images to estimate distances traveled. The latter can be used, for example, for improved navigation accuracy in robots or vehicles that use any form of propulsion on any type of surface. By way of illustration, without limiting embodiments of the present invention thereto, some optional features will be described in detail below. Such features may relate to more than one component of the system. As indicated above, the system may be adapted in some embodiments to eliminate or minimize reflection of sunlight and/or to reduce directional reflection of sunlight in reflective surfaces such as water. One or more of the at least one radiation source, the detector and/or the processor may be adapted for this. In one embodiment, the at least one radiation source may be adapted to generate radiation with one or more predetermined polarization state(s), and the at least one detector may be adapted to receive only radiation in that one or more predetermined polarization state(s). The latter can be obtained by providing a polarization filter in front of the at least one detector for selecting the one or more predetermined polarization state(s). In another embodiment, the at least one radiation source may be adapted to generate radiation with one or more predetermined wavelength(s) that may lie within or outside the water spectrum, and the detector may be adapted, e.g. by using wavelength filters, to limit the detection mainly to the one or more predetermined wavelength(s). Alternatively, these wavelength(s) can be selected at the level of the processor from images recorded with a larger wavelength spectrum. Another alternative is that the at least one radiation source is not specially adapted, but that the at least one detector filters one or more predetermined wavelength(s) from the image, which may lie within or outside the water spectrum.
Filtering can also be done at the processor level from images recorded with a larger wavelength spectrum. In yet another embodiment, the processor used may be adapted to eliminate daylight or sunlight from an image by image processing, e.g. by correcting images with an adjusted daylight/sunlight reference contribution, so as to eliminate daylight/sunlight. In some embodiments, the at least one radiation source may be adapted to emit at least two different wavelengths, i.e. to create a multispectral radiation pattern. The latter can be advantageous for the detection of certain objects, such as, for example, the detection of organic material, or the detection of ice, water, snow, etc. As with blocking sunlight, the detector can also be adapted with a filter for passing the corresponding wavelengths emitted by the radiation source, so as to operate selectively. In some embodiments, a corresponding filter can be provided for each radiation wavelength used. The predetermined wavelengths can each be detected separately, or in a single detection. In one embodiment, the processor may also, or alternatively, be adapted to filter these wavelengths from an image recorded in a wider wavelength range. The information recorded at specific wavelengths can thus be used to selectively detect certain objects that are more sensitive to these specific wavelengths. In some embodiments, the characteristic detection system may also be adapted to perform a differential measurement. The system can use at least two detectors that are spatially separated. The system can be programmed to subtract images from two different detectors to eliminate sunlight. In one example, one of the detectors can take an image without spots and a second detector can take an image with spots, and the images (or rather the pixel values thereof) can be divided or subtracted to obtain a differential result. A controller can be present and configured to perform differential measurements.
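The differential measurement described above reduces to a per-pixel subtraction once the two frames are aligned and simultaneous. The following is a minimal sketch with synthetic data, not an implementation from the patent:

```python
import numpy as np

def differential_image(frame_with_spots, frame_without_spots):
    """Subtract a spot-free frame from a spot frame so that the common
    ambient (daylight) contribution cancels; negatives are clipped."""
    a = np.asarray(frame_with_spots, dtype=float)
    b = np.asarray(frame_without_spots, dtype=float)
    return np.clip(a - b, 0.0, None)

ambient = np.full((4, 4), 100.0)  # uniform daylight background
spots = ambient.copy()
spots[1, 2] += 50.0               # one projected spot on top of the background
diff = differential_image(spots, ambient)
```

After subtraction, only the projected spot remains; dividing instead of subtracting is the ratio variant the text also mentions.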
Such a controller can control the at least two detectors and/or the at least one radiation source (e.g. synchronize the image recording). An image processing unit may also be present for processing the images from the at least two detectors to obtain a differential measurement, e.g. by subtracting or dividing the images. When use is made of differential measurements, images can be recorded simultaneously by several detectors. In this way, identical images can be obtained for subtraction or division, in which the radiation from the radiation source is included in one image, while this is not the case for the other image. Another method for differential measurements is based on different wavelengths, where the projector uses band pass filters based on different wavelengths. In some embodiments, the present invention also relates to a system that includes a plurality of systems as described above. The latter makes it possible to cover a larger field of view. The plurality of systems includes as many systems as necessary to cover the field of view in which one is interested. The plurality of systems can thereby be mounted such that they cover the field of view of interest. This can be a system that covers 360° horizontally and 180° vertically to achieve a surround view. Alternatively, any other size of field of view can be obtained by appropriately choosing the number of systems and their positions. Advantageously, the field of view realized by repeating the system at several physically different locations can be organized specifically for certain applications such as, for example, but not exclusively, autonomous vehicles, warehouse transport, inspection vehicles, etc. Alternatively or additionally, one or more systems or components thereof are adaptable to cover different fields of view. The latter can be achieved by making one or more systems movable to different positions, e.g.
to different positions around a car in vehicle applications, by tilting the system at different angles, e.g. by changing the mounting angle, by making a system steerable, by creating an overlap between the fields of view of systems, or a combination thereof. In some embodiments, the system may be adapted to cover a field of view from near the vehicle up to the radar range. The system can therefore be combined with radar measurement to extend the range for certain applications, such as pedestrian detection, collision avoidance or other safety and autonomous-vehicle related applications, or to provide additional and/or complementary measurement data for the benefit of the application. More generally, the system can be combined with additional sensors such as radar(s), infrared camera sensor(s), etc. In embodiments of the present invention, the distance between the detector and the projector is advantageously not too small, to allow accurate image processing. In a second aspect, the present invention relates to vehicles in which the detection system 100 is implemented for observing road conditions and as input for controlling the vehicle's suspension system. Characteristics and advantages of vehicles according to embodiments of the present invention may correspond to characteristics and advantages of systems according to embodiments of the first aspect of the present invention. In some embodiments of the present invention, the system for detecting a characteristic, e.g. a profile or property, of an object is placed at the front of the vehicle. The object to be studied is in that case the road in front of the vehicle. Ranges from 1 m to 15 m, or even from 1 m to 30 m, in front of the vehicle can be observed. More generally, the system can be positioned at one or more positions to cover the required field of view. Furthermore, the system can be integrated into existing or new cavities in the vehicle. Such integration can be seamless.
It is an advantage of embodiments of the present invention that the characterization system can be combined with other techniques to expand the possibilities. For example, in one embodiment, the system may be extended with a radar system to extend the range that can be observed. A multitude of applications can be considered. In a first application, determining a characteristic of an object involves determining the profile of a road, in order to use this information for active vehicle suspension. Compensation by the active suspension is based on the average height in a zone with an area the same size as the contact surface of the tire on the road. Such compensation can be performed approximately 50 to 100 ms before the event. The distance to the relevant zone to be observed is a function of the speed of the vehicle. The lateral position (left/right) of the relevant zone is a function of the steering angle. The transformation to be performed for the three degrees of freedom (X, Y and theta) can be identified based on an optimal correlation between two consecutive images. This transformation is then used to determine a new set of height measurements. Thus, in advantageous embodiments, the detection system 100 can interface with the suspension system of a vehicle. Such interfacing can be done through an interface or output means. The data can be communicated to a controller for controlling the active suspension system. The result is that the vehicle moves smoothly over the road. Such a smooth ride is also called the "flying carpet" feeling. The application can therefore increase driving comfort. The active suspension is controlled based on the measurements obtained, to adjust to road imperfections, but also to adapt to the type of pavement, to adapt to road conditions such as ice, water, snow, gravel, etc., and to adjust the tilt, roll, etc. It is noted that the image of the scene to be observed is taken repeatedly, at the frame rate of the camera.
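The look-ahead geometry implied above (compensation 50 to 100 ms before the event, observation distance a function of speed) amounts to a simple product of speed and lead time; the function name below is illustrative:

```python
def lookahead_distance(speed_m_s, lead_time_s):
    # The zone must be observed lead_time_s before the wheels reach it,
    # so it lies speed * lead_time ahead of the vehicle.
    return speed_m_s * lead_time_s

# At the maximum speed of 50 m/s from the text:
near = lookahead_distance(50.0, 0.05)  # 50 ms lead -> 2.5 m ahead
far = lookahead_distance(50.0, 0.10)   # 100 ms lead -> 5.0 m ahead
```

Even at maximum speed the relevant zone therefore lies only a few meters ahead, well inside the 1 m to 30 m observation range stated earlier.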
So the characteristic of the road is constantly updated and fed back to the active suspension system. In one particular example, 'steering' may include steering the suspension such that the tilt vector is in the same direction as the centrifugal force, so that passengers are pressed into their seats. In another example, 'steering' may include actively lifting the chassis of the vehicle during braking, such that a greater downward force on the vehicle, and thus a greater frictional force on the tires, is obtained, resulting in a shorter braking distance. Yet another example of 'steering' is maximizing the tilt of a vehicle for controlled collision damage, resulting in a raised front of the vehicle to limit the risk of sliding under a truck. In other circumstances, 'steering' is performed such that fuel consumption is minimized, by reducing the tilt of a vehicle and therefore reducing the air resistance. In other words, in some embodiments, the system can be used to obtain a characteristic of the vehicle itself and to control such a characteristic. The rolling motion and/or tilt of the vehicle can be identified on the basis of the six-degree-of-freedom transformation of the vehicle coordinate system with respect to a least-squares plane fitted through the measurement points. As the above application shows, another intended application is the provision of safety information or the implementation of safety actions. The system can be used to detect traffic lights, moving objects, proximity to objects, any moving or stationary object with regard to its environment, road conditions, road type, etc. In embodiments of the present invention, the depth information of the road ahead of the car, detected by the detection system, can be used as a 3D component that can be added to the image of a color camera to display this image as a 3D image. One application can therefore be to provide an image of the profile of the road ahead of the car.
It is an advantage of embodiments of the present invention that the system can also be used to alert the driver to poor conditions such as, for example, rain or snow. In other words, a characteristic of the road other than the profile can also be used to display relevant information to the driver. In yet another application, the system can be used for autonomous or assisted driving of vehicles. The system can be adapted to perform ego-motion related applications. The processor may be adapted to provide data useful for ego-motion related applications. The projector, i.e. the radiation source, and/or the detector can also be arranged for creating/recording relevant data for ego-motion related applications. In one particular example, ego-motion related data may be based on the motion of a vehicle relative to lines on the road or relative to street signs as viewed from the vehicle itself. Ego-motion related information can, for example, be used in autonomous navigation applications. The system can be used, for example, to provide or assist in a control function. It can provide control instructions for semi-autonomous or autonomous control of vehicles of any type, e.g. in agriculture, for any type of transport, for roadside inspection, for a garbage collection truck, for picking up and collecting objects in a warehouse, for loading applications, etc. A processor can be used to derive the speed of the car from the data from the detection system. The orientation of the car relative to the road can also be derived from the data from the detection system. The orientation can then be expressed as the angle the car forms with the center line of the road, or in any other suitable way. A processor adapted to derive such information may be embedded in the detection system or in another system, e.g. a car control system.
The system can also perform one or more of the following: speed calculation, determination of moving or stationary objects with respect to their environment, etc. A processor adapted to derive such information may be embedded in the detection system or in another system, e.g. the car. The output of the detection system mounted on a vehicle can therefore be used not only for the suspension system, but also as input for various active components in a car, which allows the car to be controlled autonomously.

In a third aspect, the present invention relates to a method for detecting a characteristic, e.g. a profile or property, of an object. Such a method can advantageously be used in combination with a system as described in the first aspect, although embodiments of the present invention are not limited thereto. The method can advantageously be used to determine a characteristic such as a road profile. The method may be embedded in a method for controlling a car suspension, comprising performing a method for detecting a characteristic, e.g. a profile or property, of a road, and a step of using such information for actively controlling the suspension. Nevertheless, the method for detecting a characteristic, e.g. a profile or property, of an object can also be used outside automotive applications, for any other suitable application.

In a first step 201, a radiation pattern is transmitted to an object to be observed, using at least one projector that includes a radiation source. The radiation pattern can be emitted in one go (i.e., simultaneously). In any case, the radiation pattern is transmitted in a pulsed manner. The latter advantageously results in a better signal-to-noise ratio, since the amount of power that can be supplied in a pulse can be greater, which can result in less disturbance by ambient light such as, for example, daylight. In a second step 202, the reflected pattern is detected using a detector 102.
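The signal-to-noise argument for pulsed operation can be made concrete with a back-of-the-envelope sketch (the function name and numbers are illustrative, not from the patent): at a fixed average source power, pulsing at duty cycle D raises peak power by 1/D, and a detector gated to the pulses still collects the full signal but only a fraction D of the ambient light, so the signal-to-background ratio improves by 1/D.

```python
def signal_to_background_gain(duty_cycle):
    """With fixed average source power, pulsing at the given duty cycle
    concentrates the signal into the gated window while the synchronized
    detector integrates ambient light only during that window, improving
    signal-to-background by 1/duty_cycle."""
    if not 0 < duty_cycle <= 1:
        raise ValueError("duty cycle must be in (0, 1]")
    return 1.0 / duty_cycle

# A 1% duty cycle gives a 100x signal-to-background advantage over
# continuous illumination at the same average power:
gain = signal_to_background_gain(0.01)
```

This is why the synchronization between source pulses and detector sampling matters: the gain only materializes if the detector ignores light outside the pulse windows.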
The detector has a plurality of pixels so that it can detect the entire reflected radiation pattern with a resolution sufficient to determine the pattern. The detection is synchronized with the radiation by means of a synchronizing means 104. In a third step 203, the data from the detector 102 is processed using a processor 103. Triangulation-based methods performed on the data make it possible to obtain profile information from that data. The method can include an auto-calibration phase. A schematic representation of a method according to embodiments of the present invention is shown in FIG. 2.

Although the aforementioned method has been described for detecting a characteristic of an object, e.g. a profile of a road, it will be apparent to those skilled in the art that a number of other applications can be performed using the present method. Specific applications can be based on a specific interpretation of the measured raw data in combination with, for example, calibration data sets. Examples of applications are given in, and correspond to, the second aspect.

Accordingly, in one aspect, the present invention relates to a method for characterizing an object or scene, the method comprising the steps of sending a pulsed radiation pattern to the object or scene using at least one radiation source, detecting the reflected pattern using at least one detector having a plurality of pixels, the detection being synchronized with the pulsed radiation pattern so that radiation to be recorded is detected only during the radiation pulses, and processing data from the detector for determining a property of the object or the scene. The method may further comprise features and advantages according to the methods described above, or may express the functionality of features of the systems described above.
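For the triangulation-based processing in step 203, a minimal sketch of the underlying geometry may help. Assuming a rectified projector-camera pair (an assumption for illustration; the function name and constants are ours, not from the patent), the depth of a spot follows from the shift d between its reference position and its observed position as Z = f·B/d:

```python
def depth_from_spot_shift(shift_px, focal_px, baseline_m):
    """Rectified triangulation for a projector-camera pair with baseline
    B (m) and focal length f (px): a spot observed shifted by d pixels
    from its reference (infinite-range) position lies at depth
    Z = f * B / d."""
    if shift_px <= 0:
        raise ValueError("spot shift must be positive for a finite range")
    return focal_px * baseline_m / shift_px

# f = 1400 px, baseline 0.2 m, observed shift 40 px -> 7 m:
z = depth_from_spot_shift(40.0, 1400.0, 0.2)
```

Because Z varies as 1/d, sub-pixel localization of the spot center (the multi-pixel fitting mentioned in the claims) directly improves depth resolution, especially at long range where d is small.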
Claims:
Claims (35) [1] A system (100) for detecting a characteristic of an object, wherein the system comprises: - a radiation source (101) for projecting a radiation pattern in a pulsed manner, - at least one detector (102) provided with a plurality of pixels for detecting a reflection of the irradiation pattern reflected by an object, - a processor (103) for processing the data detected by the at least one detector (102), and - a synchronizing means (104) which interfaces between the at least one detector (102) and the radiation source (101) and which is provided for synchronizing the at least one detector (102) with the at least one projector (101) so that radiation to be processed is detected by the detector only during the radiation pulses, characterized in that the radiation source (101) comprises at least one laser arranged for generating a plurality of simultaneously emitted laser spots that form the irradiation pattern, that the processor (103) is configured to process the detected data based on a shift between a reference position of a spot and an observed position of a corresponding spot in the reflection detected by the detector (102), and that the processor is adapted to perform multi-pixel fitting. [2] A system (100) according to claim 1, wherein the projector (101) is based on a radiation source that provides monochromatic radiation in the near-infrared spectrum. [3] A system (100) according to any one of the preceding claims, wherein the at least one projector comprises at least one laser and at least one optical grating for generating a plurality of laser radiation spots of the irradiation pattern. [4] A system (100) according to any one of the preceding claims, wherein the projector comprises a semiconductor laser. [5] A system (100) according to any one of the preceding claims, wherein the projector (101) comprises a multi-laser radiation source. 
[6] A system (100) according to any one of the preceding claims, wherein the projector (101) comprises a single VCSEL source or a VCSEL array. [7] A system (100) according to any one of the preceding claims, wherein the projector (101) comprises projector optics for projecting the irradiation pattern onto the requested field of view. [8] A system (100) according to claim 7, wherein the projector optics comprises microarray optics for projecting individual spots of the irradiation pattern and/or wherein the projector optics comprises a diverging lens system for widening the irradiation pattern to realize a predetermined field of view. [9] A system (100) according to any one of the preceding claims, wherein the projector is configured to be mechanically oriented towards the field of view of interest. [10] A system (100) according to any one of the preceding claims, wherein the projector is configured to project a subset of the spots of the irradiation pattern. [11] A system (100) according to any one of the preceding claims, wherein the projector, operating in pulsed mode, is adapted to sequentially repeat the projection of the irradiation pattern of spots or a part thereof. [12] A system (100) according to any one of the preceding claims, wherein the projector is configured to individually control spots or subgroups of spots of the irradiation pattern with respect to their intensity per area, shape and/or activation. [13] A system (100) according to claim 12, wherein the projector is configured to perform said individual control for generating spots of equal intensity per area in the spatially radiated spot pattern, regardless of the position of the spot in the spatially radiated spot pattern. [14] A system (100) according to any preceding claim, wherein the projector has a radiation wavelength FWHM of 1 nanometer or less and/or wherein the projected spots of the spatial irradiation pattern have a divergence of no more than 3 mrad. 
[15] A system (100) according to any one of the preceding claims, wherein the at least one detector is a CMOS or CCD sensor. [16] A system (100) according to any one of the preceding claims, wherein the detector comprises a narrow bandpass spectral filter positioned in front of the detector (102), the narrow bandpass spectral filter being aligned with respect to the irradiation wavelength. [17] A system (100) according to any one of the preceding claims, wherein the narrow bandpass filter comprises a dome-shaped optical element with a filter coating applied to it. [18] A system (100) according to any one of the preceding claims, wherein the detector comprises an array of micro lenses that adjust the angle of incidence of incoming reflected radiation on a narrow bandpass filter. [19] A system (100) according to any one of the preceding claims, wherein the detector comprises detector optics with a telecentric lens design. [20] A system (100) according to any one of the preceding claims, wherein the system comprises a shutter (105), wherein the shutter (105), in the closed state, blocks radiation incident on the at least one detector (102), and wherein the synchronizing means (104) is adapted to synchronize the pulses from the radiation source (101) with the opening and closing of the shutter (105). [21] A system (100) according to any one of the preceding claims, wherein the detector comprises a variable attenuation filter or pupil for equalizing spot intensity for spots located at different positions of the irradiation pattern. [22] A system (100) according to any one of the preceding claims, wherein the processor is configured to perform 3D reconstruction of a profile of an object. [23] A system (100) according to any one of the preceding claims, wherein the processor is adapted to process the detected data on the basis of triangulation. 
[24] A system (100) according to any one of the preceding claims, wherein the processor is provided for determining a 3D characteristic of an object by determining a displacement of spots, detected with the at least one detector, with respect to predetermined reference spot positions. [25] A system (100) according to any one of the preceding claims, wherein the processor is adapted to perform data processing based on intensity analysis and/or spot size analysis and/or based on difference analysis of multiple images from different cameras, of images with different irradiation wavelengths, or of images with and without spots, and/or wherein the data processing is adapted for pre-filtering the recorded data to create an equal intensity for the spots in the irradiation pattern. [26] A system (100) according to any one of the preceding claims, wherein the at least one projector (101) and the at least one detector (102) have a different angular orientation with respect to the object being irradiated. [27] A vehicle, wherein the vehicle comprises a system (100) according to any one of claims 1 to 26. [28] A vehicle according to claim 27, wherein the system is mounted on top of the vehicle at one or a plurality of locations to cover a predetermined field of view, and/or wherein the system is fully integrated into existing cavities or newly created spaces of the vehicle. [29] A vehicle according to any one of claims 27 to 28, wherein the vehicle has an adjustable suspension comprising a suspension system and a control system, the control system being adapted to receive profile information from the system for determining a characteristic of an object and being adapted to use the information for controlling the suspension system. 
[30] A camera, wherein the camera comprises a system (100) according to any one of claims 1 to 26, wherein the system (100) is adapted to add 3D information to the camera image based on information obtained by the system, which makes it possible to create a 3D image. [31] A method (100) for detecting a characteristic of an object, the method comprising the steps of: - sending (201) a pulsed radiation pattern to the object using at least one projector (101) comprising at least one laser, wherein the radiation pattern is a spatial distribution of a plurality of simultaneously emitted laser spots, - detecting (202) the reflected pattern using a detector (102) having a plurality of pixels, the detection being synchronized with the pulsed radiation pattern so that radiation to be processed is detected only during the radiation pulses, and - processing (203) the data from the detector (102) to determine a characteristic of an object based on a calculation of a shift between a reference position of a spot and an observed position of a corresponding spot in the reflection detected by the detector (102), wherein the processing uses multi-pixel fitting. [32] Use of a system (100) according to any one of claims 1 to 26 for measuring a profile, a texture and/or a condition of the road ahead of a car. [33] Use of a system (100) according to any one of claims 1 to 26 for driving autonomous vehicles. [34] Use of a system (100) according to any one of claims 1 to 26 for 3D reconstruction of the road, as input for active suspension control of a vehicle. [35] Use of a system (100) according to any one of claims 1 to 26 for determining a dynamic vehicle parameter such as roll, tilt or speed.
Patent family:
Publication number | Publication date
CN105393083A | 2016-03-09
JP2016530503A | 2016-09-29
EP3019824B1 | 2018-01-10
WO2015004213A1 | 2015-01-15
US9551791B2 | 2017-01-24
CN105393083B | 2018-07-13
KR20160029845A | 2016-03-15
EP3019824A1 | 2016-05-18
US20160018526A1 | 2016-01-21
KR102327997B1 | 2021-11-17
JP6387407B2 | 2018-09-05
Priority:
Application number | Filing date | Patent title
EP13175826.0A | 2013-07-09 | Surround sensing system
EP131758260
EP14156673 | 2014-02-25
EP14173065 | 2014-06-18